36 research outputs found

    A simulation-optimization framework for reducing thermal pollution downstream of reservoirs

    Thermal pollution is an environmental impact of large dams that alters the natural temperature regime of downstream river ecosystems. The present study proposes a simulation-optimization framework to reduce thermal pollution downstream of reservoirs and tests it on a real-world case study. The framework aims to simultaneously minimize environmental impacts and losses to the reservoir's water supply objectives. A hybrid machine-learning model is applied to simulate water temperature downstream of the reservoir under various operation scenarios; this model is shown to be robust and achieves acceptable predictive accuracy. The simulation-optimization results indicate that the reservoir could be operated in such a way that the natural temperature regime is reasonably preserved to protect downstream habitats. Doing so, however, would result in significant trade-offs for reservoir storage and water supply objectives. Such trade-offs can undermine the benefits of reservoirs and need to be carefully considered in reservoir design and operation.
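    A minimal sketch of a simulation-optimization loop of the kind described above, assuming a hypothetical temperature simulator, a simple weighted-sum scalarization, and random search. All names and numbers are illustrative and are not taken from the study, which used a hybrid machine-learning simulator and a formal multi-objective optimizer.

```python
# Hedged, illustrative sketch: candidate release schedules are simulated and
# two competing objectives -- deviation from the natural temperature regime
# and water-supply shortfall -- are traded off via a weighted sum.
import random

def simulate_downstream_temperature(releases):
    """Stand-in for the hybrid machine-learning temperature simulator."""
    # Toy rule: larger deep-water releases cool the downstream reach.
    return [20.0 - 0.05 * r for r in releases]

def objectives(releases, natural_temps, demand):
    simulated = simulate_downstream_temperature(releases)
    thermal_dev = sum(abs(s - n) for s, n in zip(simulated, natural_temps))
    supply_shortfall = sum(max(demand - r, 0.0) for r in releases)
    return thermal_dev, supply_shortfall

def random_search(natural_temps, demand, weight=0.5, iters=2000, seed=1):
    """Weighted-sum random search; only approximates a true multi-objective
    optimizer, but shows the structure of the simulation-optimization loop."""
    rng = random.Random(seed)
    best, best_score = None, float("inf")
    for _ in range(iters):
        releases = [rng.uniform(0.0, 100.0) for _ in natural_temps]
        dev, short = objectives(releases, natural_temps, demand)
        score = weight * dev + (1.0 - weight) * short
        if score < best_score:
            best, best_score = releases, score
    return best, best_score

if __name__ == "__main__":
    natural = [16.0, 17.5, 19.0, 18.0]   # illustrative seasonal targets (deg C)
    schedule, score = random_search(natural, demand=60.0)
    print([round(r, 1) for r in schedule], round(score, 2))
```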

    Developing Efficient Strategies for Automatic Calibration of Computationally Intensive Environmental Models

    Environmental simulation models have played a key role in civil and environmental engineering decision-making for decades. The utility of an environmental model depends on how well the model is structured and calibrated. Model calibration is typically automated: the simulation model is linked to a search mechanism (e.g., an optimization algorithm) that iteratively generates many parameter sets (e.g., thousands of parameter sets) and evaluates them by running the model, in an attempt to minimize differences between observed data and the corresponding model outputs. The challenge arises when the environmental model is computationally intensive to run (with run-times of minutes to hours, for example), as any automatic calibration attempt then imposes a large computational burden. Such a challenge may lead model users to accept sub-optimal solutions and not achieve the best model performance. The objective of this thesis is to develop innovative strategies to circumvent the computational burden associated with automatic calibration of computationally intensive environmental models.
    The first main contribution of this thesis is a strategy called “deterministic model preemption”, which opportunistically evades unnecessary model evaluations in the course of a calibration experiment and can save a significant portion of the computational budget (as much as 90% in some cases). Model preemption monitors the intermediate simulation results while the model is running and terminates (i.e., pre-empts) the simulation early if it recognizes that running the model further would not guide the search mechanism. This strategy is applicable to a range of automatic calibration algorithms (i.e., search mechanisms) and is deterministic in that it leads to exactly the same calibration results as when preemption is not applied.
    Another main contribution is developing and utilizing the concept of “surrogate data”, a reasonably small but representative subset of the full set of calibration data. This concept is inspired by existing surrogate modelling strategies, in which a surrogate model (also called a metamodel) is developed and used as a fast-to-run substitute for an original computationally intensive model. A framework is developed to efficiently calibrate hydrologic models to the full set of calibration data while running the original model only on the surrogate data for the majority of candidate parameter sets, leading to considerable computational saving. To this end, mapping relationships are developed to approximate the model performance on the full data from the model performance on the surrogate data. This framework is applicable to the calibration of any environmental model for which appropriate surrogate data and mapping relationships can be identified.
    As another main contribution, this thesis critically reviews and evaluates the large body of literature on surrogate modelling strategies from various disciplines, as they are the most commonly used methods to relieve the computational burden associated with computationally intensive simulation models. To evaluate these strategies reliably, a comparative assessment and benchmarking framework is developed that gives a clear, computational-budget-dependent definition of the success or failure of surrogate modelling strategies. Two large families of surrogate modelling strategies are critically scrutinized and evaluated: “response surface surrogate” modelling, which involves statistical or data-driven function approximation techniques (e.g., kriging, radial basis functions, and neural networks), and “lower-fidelity physically-based surrogate” modelling strategies, which develop and utilize simplified models of the original system (e.g., a groundwater model with a coarse mesh). This thesis raises fundamental concerns about response surface surrogate modelling and demonstrates that, although they might be less efficient, lower-fidelity physically-based surrogates are generally more reliable because they preserve, to some extent, the physics of the original model.
    Five different surface water and groundwater models are used across this thesis to test the performance of the developed strategies and to support the discussions. The strategies developed are, however, typically simulation-model-independent and can be applied to the calibration of any computationally intensive simulation model that has the required characteristics. This thesis leaves the reader with a suite of strategies for efficient calibration of computationally intensive environmental models, along with guidance on how to select, implement, and evaluate the appropriate strategy for a given environmental model calibration problem.
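    As a concrete illustration of the deterministic model preemption idea, the sketch below assumes an error metric (here, a sum of squared errors) that can only grow as the simulation advances in time, so a run can be stopped as soon as its partial error exceeds the current preemption threshold without changing the outcome of the search. This is a hedged illustration, not the thesis implementation; the toy model, candidate list, and greedy search are placeholders.

```python
# Illustrative "deterministic model preemption": with a monotonically
# accumulating metric such as SSE, a simulation is terminated as soon as its
# partial error already exceeds the best objective value found so far.

def sse_with_preemption(simulate_step, params, observations, threshold):
    """Run the model step by step; return None if the run is preempted."""
    sse = 0.0
    for t, obs in enumerate(observations):
        sim = simulate_step(params, t)          # one time step of the model
        sse += (sim - obs) ** 2                 # SSE can only grow over time
        if sse > threshold:                     # further running cannot help
            return None                         # pre-empt the simulation
    return sse

def greedy_calibration(simulate_step, candidates, observations):
    """Greedy search whose result is identical with or without preemption."""
    best_params, best_sse = None, float("inf")
    for params in candidates:
        sse = sse_with_preemption(simulate_step, params, observations, best_sse)
        if sse is not None and sse < best_sse:
            best_params, best_sse = params, sse
    return best_params, best_sse

if __name__ == "__main__":
    # Toy "model": a straight line y = a * t, calibrated to synthetic data.
    observations = [2.0 * t for t in range(50)]
    simulate_step = lambda a, t: a * t
    candidates = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
    print(greedy_calibration(simulate_step, candidates, observations))
```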

    Sensitivity analysis: A discipline coming of age

    Sensitivity analysis (SA) as a ‘formal’ and ‘standard’ component of scientific development and policy support is relatively young. Many researchers and practitioners from a wide range of disciplines have contributed to SA over the last three decades, and the SAMO (sensitivity analysis of model output) conferences have, since 1995, been the primary driver in fostering a community culture within this heterogeneous population. SA is now evolving into a mature and independent field of science, indeed a discipline with emerging applications extending well into new areas such as data science and machine learning. At this growth stage, the present editorial leads a special issue consisting of one Position Paper on “The future of sensitivity analysis” and 11 research papers on “Sensitivity analysis for environmental modelling” published in Environmental Modelling & Software in 2020–21.

    Reconstruction Of Paleo-Hydrologic Data For Vulnerability Assessment Of Water Resources Systems

    Tree-ring chronologies are a rich source of information on past climate-driven non-stationarities in hydrologic variables. They are typically directly related to the water available in the respective years, thereby providing a basis for paleo-hydrology reconstruction. This study investigates time series of tree-ring chronologies with the objective of identifying the spatiotemporal patterns and extents of non-stationarities, which are essentially representations of past “climate changes”. The study also generates ensembles of moving-average streamflow time series for the centuries prior to the period of observational record. The major headwater tributaries of the Saskatchewan River basin (SaskRB), the main source of surface water in the Canadian Prairie Provinces, are used as the case study. This extended abstract gives a brief summary of the methodology and some examples of the results. The analyses and results show how the reconstruction of paleo-hydrology broadens the understanding of the hydrologic characteristics of a basin beyond the limited observational records and therefore provides a basis for more reliable assessment and management of available water resources.
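    The sketch below illustrates, under stated assumptions, the basic logic of this kind of reconstruction: a tree-ring chronology is regressed on observed annual flow over the overlap period, the fitted relationship is extended back in time, and an ensemble of moving-average series is generated by resampling the calibration residuals. The regression form, window length, and ensemble scheme are illustrative placeholders rather than the study's actual method.

```python
# Hedged, illustrative paleo-streamflow reconstruction sketch (not the
# study's method): linear regression on the overlap period plus residual
# resampling to build an ensemble of moving-average flow series.
import random

def fit_linear(x, y):
    """Ordinary least squares for a single predictor; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def moving_average(series, window):
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

def reconstruct_ensemble(chronology, rings_obs, flow_obs,
                         window=10, n_members=100, seed=0):
    """Build an ensemble of moving-average reconstructed flows."""
    a, b = fit_linear(rings_obs, flow_obs)
    residuals = [y - (a + b * x) for x, y in zip(rings_obs, flow_obs)]
    rng = random.Random(seed)
    ensemble = []
    for _ in range(n_members):
        member = [a + b * r + rng.choice(residuals) for r in chronology]
        ensemble.append(moving_average(member, window))
    return ensemble

if __name__ == "__main__":
    # Synthetic overlap period: ring-width index loosely tracks annual flow.
    rings_obs = [0.8, 1.1, 0.9, 1.3, 1.0, 1.2, 0.7, 1.4]
    flow_obs = [55.0, 72.0, 60.0, 85.0, 66.0, 78.0, 48.0, 90.0]
    chronology = [1.0, 0.9, 1.2, 0.8, 1.1, 1.3, 0.95, 1.05, 0.85, 1.15,
                  1.0, 0.9, 1.2, 0.8, 1.1]
    for member in reconstruct_ensemble(chronology, rings_obs, flow_obs,
                                       window=5, n_members=3):
        print([round(v, 1) for v in member])
```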

    Estimation of Effective Doses and Lifetime Risk of Exposure-induced Cancer Death in Pediatric CT Scans

    Background: The increasing frequency of computed tomography (CT) scans for a range of purposes, particularly in pediatrics, has raised concerns regarding the population's radiation exposure and subsequent chances of cancer. This study aimed to estimate the effective doses of pediatric radiation and the induced cancer risks from the five most common CT scan procedures in Yazd Province, Iran. Methods: Data of pediatric patients from four age groups of ≤1, 1-5, 5-10, and 10-15 years old were retrospectively collected from 6 educational institutions located in diverse areas of Yazd Province. For each participant, the effective dose and the REID (risk of exposure-induced death) rate were estimated with the Impact Dose and PCXMC software, respectively. The findings were then reported by categorizing the patients according to their effective diameter. Results: The effective doses and REID values did not show any significant differences among the studied age groups. The highest mean effective dose was recorded for the abdomen-pelvis scan (average ± standard deviation, 5.24±3.19 mSv), followed by the chest (3.76±2.28 mSv), brain (1.25±1.07 mSv), and sinus (0.65±0.4 mSv) examinations. The highest REID was documented for the chest scan (490±314 excess deaths in one million scans), followed by the abdomen-pelvis procedure (404±280). Conclusion: The radiation doses delivered to the pediatric patients and the associated fatal cancer risk of common CT procedures were in the same range as those reported in previous studies. Our findings provide an estimate of the radiation-induced risks of CT scans and can be used to extend the knowledge of clinicians and researchers.

    Exploding the myths: An introduction to artificial neural networks for prediction and forecasting

    Artificial Neural Networks (ANNs), sometimes also called deep learning models, are used extensively for the prediction of a range of environmental variables. While the potential of ANNs is unquestioned, they are surrounded by an air of mystery and intrigue, leading to a lack of understanding of their inner workings. This has led to the perpetuation of a number of myths, resulting in the misconception that applying ANNs primarily involves "throwing" a large amount of data at "black-box" software packages. While this is a convenient way to side-step the principles applied to the development of other types of models, it comes at a significant cost in terms of the usefulness of the resulting models. To address these issues, this introductory overview paper explodes a number of the common myths surrounding the use of ANNs and outlines state-of-the-art approaches to developing ANNs that enable them to be applied with confidence in practice.

    On how data are partitioned in model development and evaluation: Confronting the elephant in the room to enhance model generalization

    Models play a pivotal role in advancing our understanding of Earth's physical nature and environmental systems, aiding in their efficient planning and management. The accuracy and reliability of these models heavily rely on data, which are generally partitioned into subsets for model development and evaluation. Surprisingly, how this partitioning is done is often not justified, even though it determines what model we end up with, how we assess its performance, and what decisions we make based on the resulting model outputs. In this study, we shed light on the paramount importance of meticulously considering data partitioning in the model development and evaluation process, and its significant impact on model generalization. We identify flaws in existing data-splitting approaches and propose a forward-looking strategy to effectively confront the “elephant in the room”, leading to improved model generalization capabilities.
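    The following minimal sketch illustrates the issue at stake, assuming a simple yearly dataset: a random split and a chronological split can paint very different pictures of how a model will generalize. The split functions and numbers are illustrative only and are not the forward-looking strategy proposed in the paper.

```python
# Illustrative contrast between two common data-partitioning choices for
# model development and evaluation.
import random

def random_split(records, eval_fraction=0.3, seed=42):
    """Shuffle, then split; evaluation records interleave with development records."""
    shuffled = records[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * (1.0 - eval_fraction))
    return shuffled[:cut], shuffled[cut:]

def chronological_split(records, eval_fraction=0.3):
    """Hold out the most recent block; evaluation mimics true extrapolation."""
    cut = int(len(records) * (1.0 - eval_fraction))
    return records[:cut], records[cut:]

if __name__ == "__main__":
    years = list(range(1980, 2020))
    _, eval_random = random_split(years)
    _, eval_chrono = chronological_split(years)
    print("random eval years:       ", sorted(eval_random)[:5], "...")
    print("chronological eval years:", eval_chrono[:5], "...")
```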

    Hydrological and economic assessment of the Upper Qu’Appelle Water Supply Project: report for Western Economic Diversification

    Canada First Research Excellence Fund (CFREF). Non-peer reviewed. This report describes water resource management modeling, water quality modeling, and the economic implications of the Upper Qu’Appelle Water Supply Project.

    Diagnosis of Historical and Future Flow Regimes of the Bow River at Calgary Using a Dynamically Downscaled Climate Model and a Physically Based Land Surface Hydrological Model: Final Report

    Final report developed under Agreement #AP744 for the Natural Resources Canada Climate Change Adaptation Program, with financial and in-kind assistance from Natural Resources Canada, Alberta Environment and Parks, the City of Calgary, Environment and Climate Change Canada, and the Global Water Futures program. Non-peer reviewed. This report assesses the impacts of projected climate change on the hydrology, including the flood frequencies, of the Bow and Elbow Rivers above Calgary, Alberta. It reports on investigations of the effects of projected climate change on the runoff mechanisms of the Bow and Elbow River basins, which are important mountain headwaters in Alberta, Canada. The study developed a methodology for incorporating climate change into flood frequency estimates, applied it in a case study, and showed that it can be applied to a variety of river basins across Canada.
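    As a purely illustrative aside (not the report's methodology), the sketch below shows one simple way a shift in annual maximum flows translates into revised flood quantiles: a Gumbel (EV1) distribution is fitted by the method of moments to a synthetic "historical" series and to a synthetic "projected" series, and the 100-year flood is compared. All values are placeholders.

```python
# Illustrative flood-frequency comparison under a shifted flow regime.
import math
import statistics

def gumbel_fit(annual_maxima):
    """Method-of-moments fit of the Gumbel (EV1) distribution."""
    mu = statistics.mean(annual_maxima)
    sigma = statistics.stdev(annual_maxima)
    beta = sigma * math.sqrt(6) / math.pi      # scale parameter
    loc = mu - 0.5772 * beta                   # location (Euler-Mascheroni constant)
    return loc, beta

def flood_quantile(loc, beta, return_period):
    """Flow with an annual exceedance probability of 1/return_period."""
    p = 1.0 - 1.0 / return_period
    return loc - beta * math.log(-math.log(p))

if __name__ == "__main__":
    historical = [310, 450, 280, 520, 390, 610, 340, 470, 430, 500]   # m^3/s
    projected = [f * 1.15 for f in historical]                        # assumed +15% shift
    for label, series in (("historical", historical), ("projected", projected)):
        loc, beta = gumbel_fit(series)
        print(label, "100-year flood:", round(flood_quantile(loc, beta, 100), 1), "m^3/s")
```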